Generalized Entropy Power Inequalities and Monotonicity Properties of Information

Authors
Abstract


Similar articles

Inequalities and monotonicity of ratios for generalized hypergeometric function

Abstract. We find two-sided inequalities for the generalized hypergeometric function with positive parameters restricted by certain additional conditions. Our lower bounds are asymptotically precise at x = 0, while the upper bounds are either asymptotically precise or at least agree with q+1Fq((a_{q+1}), (b_q); −x) at x = ∞. The inequalities are derived as corollaries of a theorem asserting the monotony...


Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
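The Gaussian equality case mentioned above is easy to verify numerically. The sketch below (an illustration, not code from the paper) samples from N(0, σ²), estimates the second moment and the Fisher information of the location family via the score function d/dx log p(x) = −x/σ², and checks that their Cramér-Rao product is ≈ 1:

```python
import math
import random

random.seed(0)
sigma = 1.7
n = 200_000
xs = [random.gauss(0.0, sigma) for _ in range(n)]

# Second moment E[X^2] of the samples, ≈ sigma^2
second_moment = sum(x * x for x in xs) / n

# Fisher information E[(d/dx log p(X))^2]; for N(0, sigma^2) the score is
# -x / sigma^2, so the expectation is ≈ 1 / sigma^2
fisher = sum((x / sigma**2) ** 2 for x in xs) / n

# Cramér-Rao product: equals 1 exactly when the distribution is Gaussian
print(second_moment * fisher)  # ≈ 1.0
```

The product approaches 1 only in the Gaussian case; for any other density with the same second moment the Fisher information is strictly larger.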


On Rényi entropy power inequalities

This paper is a follow-up of a recent work by Bobkov and Chistyakov, obtaining some improved Rényi entropy power inequalities (R-EPIs) for sums of independent random vectors. The first improvement relies on the same bounding techniques used in the former work, while the second significant improvement relies on additional interesting properties from matrix theory. The improvements obtained by th...


Entropy power inequalities for qudits

Shannon’s entropy power inequality (EPI) can be viewed as a statement of concavity of an entropic function of a continuous random variable under a scaled addition rule: f(√a X + √(1 − a) Y) ≥ a f(X) + (1 − a) f(Y) for all a ∈ [0, 1]. Here, X and Y are continuous random variables and the function f is either the differential entropy or the entropy power. König and Smith [IEEE Trans. Inf. Theory. 60(...
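For independent Gaussians the EPI holds with equality, which gives a quick sanity check of the standard one-dimensional entropy-power definition N(X) = e^{2h(X)}/(2πe) (a sketch under that definition, not code from any of the papers listed here):

```python
import math

def entropy_power_gaussian(var):
    """Entropy power of N(0, var) in one dimension."""
    # Differential entropy of N(0, var): h = 0.5 * log(2*pi*e*var)
    h = 0.5 * math.log(2 * math.pi * math.e * var)
    # Entropy power: N = exp(2h) / (2*pi*e); for a Gaussian this equals var
    return math.exp(2 * h) / (2 * math.pi * math.e)

vx, vy = 2.0, 3.0
# X + Y ~ N(0, vx + vy) when X and Y are independent Gaussians
lhs = entropy_power_gaussian(vx + vy)
rhs = entropy_power_gaussian(vx) + entropy_power_gaussian(vy)
print(lhs, rhs)  # both ≈ 5.0: EPI with equality
```

Because the entropy power of a Gaussian is exactly its variance, the inequality N(X + Y) ≥ N(X) + N(Y) reduces to 5 ≥ 2 + 3 here; for non-Gaussian summands the inequality is strict.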


A Formula Relating Entropy Monotonicity to Harnack Inequalities

This implies in particular that d/dt μ(g(t), τ(t)) ≥ 0, with equality exactly for homothetically shrinking solutions of Ricci flow. An important consequence of this entropy formula is a lower volume-ratio bound for solutions of Ricci flow on a closed manifold over a finite time interval [0, T), asserting the existence of a constant κ > 0, depending only on n, T and g(0), such that the i...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2007

ISSN: 0018-9448

DOI: 10.1109/tit.2007.899484